<html xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:w="urn:schemas-microsoft-com:office:word" xmlns:m="http://schemas.microsoft.com/office/2004/12/omml" xmlns="http://www.w3.org/TR/REC-html40">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<meta name="Generator" content="Microsoft Word 15 (filtered medium)">
<style><!--
/* Font Definitions */
@font-face
        {font-family:"Cambria Math";
        panose-1:2 4 5 3 5 4 6 3 2 4;}
@font-face
        {font-family:Calibri;
        panose-1:2 15 5 2 2 2 4 3 2 4;}
@font-face
        {font-family:-webkit-standard;
        panose-1:2 11 6 4 2 2 2 2 2 4;}
@font-face
        {font-family:"Roboto Mono";
        panose-1:2 11 6 4 2 2 2 2 2 4;}
/* Style Definitions */
p.MsoNormal, li.MsoNormal, div.MsoNormal
        {margin:0in;
        margin-bottom:.0001pt;
        font-size:12.0pt;
        font-family:"Calibri",sans-serif;}
a:link, span.MsoHyperlink
        {mso-style-priority:99;
        color:#0563C1;
        text-decoration:underline;}
a:visited, span.MsoHyperlinkFollowed
        {mso-style-priority:99;
        color:#954F72;
        text-decoration:underline;}
span.EmailStyle17
        {mso-style-type:personal-compose;
        font-family:"Calibri",sans-serif;
        color:windowtext;}
span.apple-tab-span
        {mso-style-name:apple-tab-span;}
.MsoChpDefault
        {mso-style-type:export-only;
        font-family:"Calibri",sans-serif;}
@page WordSection1
        {size:8.5in 11.0in;
        margin:1.0in 1.0in 1.0in 1.0in;}
div.WordSection1
        {page:WordSection1;}
--></style>
</head>
<body lang="EN-US" link="#0563C1" vlink="#954F72">
<div class="WordSection1">
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">*********************************************************************</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span class="apple-tab-span"><span style="font-family:"Roboto Mono",serif;color:#222222">           
</span></span><span style="font-family:"Roboto Mono",serif;color:#222222">                   Call for Participation</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">                    The 2019 SUMO Challenge Workshop:</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span class="apple-tab-span"><span style="font-family:"Roboto Mono",serif;color:#222222">           
</span></span><span style="font-family:"Roboto Mono",serif;color:#222222">         360° Indoor Scene Understanding and Modeling</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">                      in Association with CVPR 2019
</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">*********************************************************************</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"-webkit-standard",serif;color:black"> <o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">Webpage:
</span><span style="font-family:"-webkit-standard",serif;color:black"><a href="https://sumochallenge.org"><span style="font-family:"Roboto Mono",serif;color:#954F72">https://sumochallenge.org</span></a><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">Workshop Schedule: June 16 or 17, 2019 (exact date TBD)</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">Location: Long Beach, CA, USA</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"-webkit-standard",serif;color:black"> <o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">In computer vision, scene understanding and modeling encapsulate a diverse set of research problems, ranging from low-level geometric modeling
 (e.g., SLAM algorithms) to 3D room layout estimation. These tasks are often addressed separately, yielding only a constrained understanding and representation of the underlying scene. In parallel, the popularity of 360° cameras has encouraged the digitization
 of the real world into augmented and virtual realities, enabling new applications such as virtual social interactions and semantically leveraged augmented reality. This workshop aims to promote comprehensive 3D scene understanding and modeling algorithms that
 create integrated scene representations (with geometry, appearance, semantics, and perceptual qualities), while utilizing 360° imagery to encourage research on its unique challenges.
</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"-webkit-standard",serif;color:black"> <o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">The SUMO Challenge, in conjunction with the workshop, provides a dataset and an evaluation platform to assess and compare such scene understanding
 approaches that generate complete 3D representations with textured 3D models, pose, and semantics. The datasets created and released for this competition may serve as reference benchmarks for future research in 3D scene understanding.</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"-webkit-standard",serif;color:black"> <o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"-webkit-standard",serif;color:black"> <o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">*** Call for Papers:</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"-webkit-standard",serif;color:black"> <o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">The workshop is soliciting papers covering various problems related to 3D and 360° scene understanding and modeling from RGB and RGB-D imagery.
 The topics mainly focus on indoor scene modeling and include, but are not limited to:</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* 360° data processing and scene understanding</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* Object detection</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* Object localization</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* Layout estimation</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* “Stuff” detection and modeling</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* Instance segmentation</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* Object completion and 3D reconstruction</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* Object pose estimation</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* Generative models
</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* Articulated object modeling</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* Texture and appearance modeling</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* Material property estimation</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* Lighting recognition</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"-webkit-standard",serif;color:black"> <o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-size:11.5pt;font-family:"Roboto Mono",serif;color:#212121;background:white">Submissions must be written in English and submitted in PDF format. Each paper must be no longer than four pages, excluding references. Please follow the CVPR author submission guidelines regarding formatting, templates, and policies, available at
</span><span style="font-family:"-webkit-standard",serif;color:black"><a href="http://cvpr2019.thecvf.com/submission/main_conference/author_guidelines"><span style="font-size:11.5pt;font-family:"Roboto Mono",serif;color:#1155CC">http://cvpr2019.thecvf.com/submission/main_conference/author_guidelines</span></a></span><span style="font-size:11.5pt;font-family:"Roboto Mono",serif;color:#212121;background:white">.
 The review process will be double blind: authors will not know the reviewers' names, and reviewers will not know the authors' names.
</span><span style="font-family:"Roboto Mono",serif;color:#222222">Accepted papers will be published in the IEEE CVPRW proceedings, available in IEEE Xplore and on the CVF website.</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"-webkit-standard",serif;color:black"> <o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"-webkit-standard",serif;color:black"> <o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">*** SUMO Challenge Details:</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"-webkit-standard",serif;color:black"> <o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">The SUMO Challenge provides 360° indoor RGBD images to benchmark RGBD-to-3D semantic scene modeling approaches. Challenge participants are asked to derive a complete, instance-based 3D representation of a scene from the given RGBD input. The representation should capture geometric instances, appearance, and semantics. Live scores, result submission and evaluation, and the datasets will be maintained on the workshop website. Participants will also be required to submit a short paper (up to 4 pages) detailing their methodology, which can be extended into a full paper for further publication. The challenge tracks are:</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* 3D Bounding Box Track</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* 3D Voxel Track</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">* 3D Mesh Track</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">The winner of each track will receive cash and equipment prizes and will be invited to give an oral presentation at the workshop. Runners-up and authors of accepted submissions will be invited to present posters.</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"-webkit-standard",serif;color:black"> <o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">*** Important dates:</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"-webkit-standard",serif;color:black"> <o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">Feb 7, 2019</span><span class="apple-tab-span"><span style="font-family:"Roboto Mono",serif;color:#222222">                
</span></span><span style="font-family:"Roboto Mono",serif;color:#222222">Challenge launch</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">March 15, 2019        Paper submission deadline & challenge checkpoint</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">April 3, 2019         Notification to authors</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">April 10, 2019</span><span class="apple-tab-span"><span style="font-family:"Roboto Mono",serif;color:#222222">             
</span></span><span style="font-family:"Roboto Mono",serif;color:#222222">Camera-ready deadline</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">April 10, 2019        Challenge deadline</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">June 16 or 17, 2019   SUMO Workshop and Challenge</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"-webkit-standard",serif;color:black"> <o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">*** Organizing Committee:</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"-webkit-standard",serif;color:black"> <o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">Daniel Huber, Facebook</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">Lyne Tchapmi, Stanford University</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">Frank Dellaert, Georgia Tech</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">Ilke Demir, DeepScale</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">Shuran Song, Princeton University</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt;background:white"><span style="font-family:"Roboto Mono",serif;color:#222222">Rachel Luo, Stanford University</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-family:"-webkit-standard",serif;color:black"><o:p> </o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt"><span style="font-family:"Roboto Mono",serif;color:black">*** Contact:</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-family:"-webkit-standard",serif;color:black"><o:p> </o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt"><span style="font-family:"Roboto Mono",serif;color:black">Daniel Huber -
</span><span style="font-family:"-webkit-standard",serif;color:black"><a href="mailto:dhuber@fb.com"><span style="font-family:"Roboto Mono",serif;color:#1155CC">dhuber@fb.com</span></a><o:p></o:p></span></p>
<p style="margin:0in;margin-bottom:.0001pt"><span style="font-family:"Roboto Mono",serif;color:black">Lyne Tchapmi -
</span><span style="font-family:"-webkit-standard",serif;color:black"><a href="mailto:lynetcha@stanford.edu"><span style="font-family:"Roboto Mono",serif;color:#1155CC">lynetcha@stanford.edu</span></a></span><span style="font-family:"Roboto Mono",serif;color:black">
</span><span style="font-family:"-webkit-standard",serif;color:black"><o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:11.0pt"><o:p> </o:p></span></p>
</div>
</body>
</html>