<div><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif;font-size:1rem">* <b style="font-size:1rem"><u><a href="http://www.google.com/url?q=http%3A%2F%2Fwww.shrec.net%2F&sa=D&sntz=1&usg=AFQjCNG1JtGV4ZLdGfDXOvK-1Dv6PZEejQ" style="color:blue;font-size:1rem" target="_blank">SHREC 2022</a></u> Track: Sketch-Based 3D Shape Retrieval in the Wild</b></span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif"> </span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif;font-size:1rem">* Website:</span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif"><a href="https://sites.google.com/site/firmamentqj/sbsrw" style="color:blue;font-size:1rem" target="_blank">https://sites.google.com/site/firmamentqj/sbsrw</a></span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><br></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm"><font face="Arial, sans-serif" style="font-size:1rem">* Registration Deadline: <b style="font-size:1rem">January 22</b></font></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm"><font face="Arial, sans-serif"><br></font></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif;font-size:1rem">* Organizers:</span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif;font-size:1rem">- Jie Qin, Nanjing University of Aeronautics and Astronautics, Nanjing, China</span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif;font-size:1rem">- Shuaihang Yuan, New York University, New York, USA</span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif;font-size:1rem">- Jiaxin Chen, Beihang University, Beijing, China</span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif;font-size:1rem">- Boulbaba Ben Amor - IMT Nord Europe, France & Inception Institute of Artificial Intelligence, UAE</span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif;font-size:1rem">- Yi Fang, NYU Abu Dhabi, UAE and NYU Tandon, USA</span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif"> </span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif;font-size:1rem">============================ Objective 
The objective of this track is to evaluate the performance of different sketch-based 3D shape retrieval algorithms on a 2D free-hand sketch dataset and a 3D shape dataset in a more realistic and challenging setting.

============================ Introduction ================================
Sketch-based 3D shape retrieval (SBSR) [1-3] has drawn a significant amount of attention, owing to the succinctness of free-hand sketches and the increasing demands of real applications. It is an intuitive yet challenging task due to the large discrepancy between the 2D and 3D modalities.

To foster research on this important problem, several tracks focusing on related tasks have been held in past SHREC challenges, such as [4-7]. However, the datasets adopted in those tracks are not realistic enough to simulate real application scenarios. To mimic the real-world scenario, a dataset should meet the following requirements. First, there should exist a large domain gap between the two modalities, i.e., sketches and 3D shapes. However, current datasets unintentionally narrow this gap by using projection-based/multi-view representations for 3D shapes (i.e., a 3D shape is manually rendered into a set of 2D images). In this way, the large 2D-3D domain discrepancy is unnecessarily reduced to a 2D-2D one. Second, the data from both modalities should themselves be realistic. More specifically, we need a wide variety of sketches per category, since real users possess varying drawing skills; as for 3D shapes, we need models captured in real-world settings rather than created artificially.
However, human sketches in existing datasets tend to be semi-photorealistic drawings made by experts, and the number of sketches per category is quite limited; meanwhile, most 3D datasets currently used in SBSR are composed of CAD models, which lack certain details compared with models scanned from real objects.

To circumvent the above limitations, this track proposes a more realistic and challenging setting for SBSR. On the one hand, we adopt highly abstract 2D sketches drawn by amateurs, and bypass projection-based representations for 3D shapes by directly representing them as 3D point clouds. On the other hand, we adopt a wide variety of free-hand sketches with numerous samples per category, as well as a collection of realistic point clouds captured from indoor objects. We therefore name this track 'sketch-based 3D shape retrieval in the wild' (SBSRW). The term 'in the wild' is reflected in two respects: 1) the domain gap between the two modalities is realistic, as we adopt sketches of high abstraction levels and 3D point cloud data; 2) the data themselves mimic the real-world setting, as we adopt a wide variety of sketches and 3D point clouds captured from real objects.

======================= Tasks ===========================
We propose two tasks to evaluate the performance of different SBSR algorithms, i.e., sketch-based 3D CAD model (point cloud) retrieval and sketch-based realistic scanned model (point cloud) retrieval.

======================= Evaluation Method ===========================
For a comprehensive evaluation of different algorithms, we employ the following performance metrics widely adopted in SBSR: nearest neighbor (NN), first tier (FT), second tier (ST), E-measure (E), discounted cumulated gain (DCG), mean average precision (mAP), and the precision-recall (PR) curve. We will provide the source code to compute all the aforementioned metrics.
======================= Procedure ===========================
The following is a step-by-step description of the activities:
- The participants register for the track by sending an email to qinjiebuaa@gmail.com with 'SHREC 2022 - SBSRW Track Registration' as the title, indicating which task(s) they are interested in.
- The organizers release the dataset via the track website.
- The participants submit their distance matrices for the test sets, together with one-page descriptions of their methods (see the illustrative snippet after this list).
- Evaluation is performed automatically on the submitted matrices, by computing all the performance metrics via the official source code.
- The organizers announce the results and the final ranking of all the participants.
- The track results are combined into a joint paper, which is subject to a two-stage peer review process. Accepted papers will be published in Computers & Graphics.
- The description of the track and the results will be presented at the Eurographics 2022 Symposium on 3D Object Retrieval (1-2 September 2022).
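The exact submission format will be specified by the organizers together with the dataset. Purely as an illustration, a dense query-by-target distance matrix could be saved as plain text with one row per query sketch; the file name and test-set sizes below are hypothetical:

```python
import numpy as np

# Hypothetical example only: the official file format and naming rules
# will be announced by the organizers and may differ from this sketch.
n_sketches, n_models = 100, 500                      # placeholder test-set sizes
dist = np.random.rand(n_sketches, n_models).astype(np.float32)
# One row per query sketch, one column per 3D model; smaller = more similar.
np.savetxt('teamname_task1_distances.txt', dist, fmt='%.6f')
```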
======================= Schedule ===========================
- January 1: Call for participation.
- January 15: Release of a few sample sketches and 3D models.
- January 22: Registration deadline.
- January 29: Release of the training set for the first task.
- February 5: Release of the training set for the second task.
- February 28: Submission deadline for the first task.
- March 4: Submission deadline for the second task.
- March 8: Release of the final results for both tasks; joint writing of the track report.
- March 15: Submission deadline for the joint paper for C&G review.

We look forward to your participation!
style="font-family:Arial,sans-serif;font-size:1rem">Best Regards,</span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif"> </span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif;font-size:1rem">Jie Qin</span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif"> </span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif;font-size:1rem">Professor</span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif;font-size:1rem">College of Computer Science and Technology</span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif;font-size:1rem">Nanjing University of Aeronautics and Astronautics (NUAA)</span></p><p class="MsoNormal" style="font-size:12pt;color:rgb(49,49,49);word-spacing:1px;margin:0cm;font-family:宋体"><span lang="EN-US" style="font-family:Arial,sans-serif;font-size:1rem">Nanjing, Jiangsu 211106, China</span></p>