Dear researcher,

The submission deadline has been extended to January 15, 2021.

Aim and Scope

Object detection is one of the most challenging and important tasks in computer vision and is widely used in applications such as autonomous vehicles, biometrics, video surveillance, and human-machine interaction. In the past five years, significant progress has been made with the development of deep learning, especially deep convolutional neural networks. Advanced object detection methods typically fall into one-stage, two-stage, and anchor-free categories. Nevertheless, their accuracy and efficiency remain far from satisfactory. On the one hand, the average precision of state-of-the-art methods is still low (e.g., only about 40% on the COCO dataset), and performance is even worse for small and occluded objects. On the other hand, achieving high precision usually comes at the cost of low detection speed, and a satisfactory trade-off between detection precision and speed remains elusive (a small sketch illustrating this trade-off follows the topic list below). Therefore, considerable effort is needed to substantially improve both the precision and the efficiency of object detection.

This special issue will publish papers presenting state-of-the-art methods for addressing the challenging problems of object detection within the deep learning framework. We invite authors to submit previously unpublished manuscripts closely related to the topics of this special issue.
The topics of interest include, but are not limited to:

- Anchor-based and anchor-free object detection
- Detecting small or occluded objects
- Context and attention mechanisms for object detection
- Fast object detection algorithms
- New backbones for object detection
- Architecture search for object detection
- 3D object detection
- Object detection in challenging conditions
- Handling scale problems in object detection
- Improving localization accuracy
- Fusion of point clouds and images for object detection
- Relationship between object detection and other computer vision tasks
- Large-scale datasets for object detection
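To make the precision/speed trade-off discussed above concrete, here is a minimal Python sketch that times one forward pass of a pretrained two-stage detector using torchvision. The model choice, dummy input size, and single-image CPU timing are illustrative assumptions only, not part of this call; real benchmarks evaluate COCO average precision over the full validation set and average timings across many runs.

```python
# Minimal sketch: timing one forward pass of a pretrained two-stage detector.
# Model choice, dummy input, and single-image timing are assumptions for
# illustration only; real benchmarks use COCO images and averaged runs.
import time
import torch
import torchvision

# Pretrained Faster R-CNN with a ResNet-50 FPN backbone (COCO weights).
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = torch.rand(3, 480, 640)  # dummy RGB image, values in [0, 1]

with torch.no_grad():
    start = time.perf_counter()
    outputs = model([image])  # list of dicts with 'boxes', 'labels', 'scores'
    elapsed = time.perf_counter() - start

boxes = outputs[0]["boxes"]    # (N, 4) boxes in (x1, y1, x2, y2) format
scores = outputs[0]["scores"]  # per-detection confidence, sorted descending
print(f"{len(boxes)} detections in {elapsed:.3f} s on CPU")
```

Swapping in a one-stage model (e.g., torchvision's retinanet_resnet50_fpn) under the same setup gives a rough sense of how the one-stage and two-stage designs mentioned above differ in speed.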
Important Dates

Submission deadline: January 15, 2021
First notification to authors: March 1, 2021
Submission of revised papers: April 15, 2021
Final notification to authors: June 15, 2021
Online publication: July 1, 2021

Submission of Manuscripts

Prospective authors should prepare manuscripts according to the Guide for Authors of Pattern Recognition Letters, available at https://ees.elsevier.com/prletters/. Please select the article type VSI:DL4PEOD when submitting.

Guest Editors

Dr. Yanwei Pang, Tianjin University, China, pyw@tju.edu.cn (Managing Guest Editor)
Dr. Jungong Han, Warwick University, U.K., jungong.han@warwick.ac.uk
Dr. Xin Lu, Adobe Inc., U.S.A., xinl@adobe.com
Dr. Nicola Conci, University of Trento, Italy, nicola.conci@unitn.it