[visionlist] [CFP] The 3rd Facial Micro-Expression Grand Challenge (MEGC) Workshop @ IEEE FG 2020

Moi Hoon Yap M.Yap at mmu.ac.uk
Tue Jan 7 19:54:31 -04 2020


[Happy New Year! Apologies if you receive multiple copies of this CFP]

CALL FOR PAPERS:

The 3rd Facial Micro-Expression Grand Challenge (MEGC) Workshop: New Learning Methods for Spotting and Recognition
in conjunction with the 15th IEEE Conference on Automatic Face and Gesture Recognition (FG) 2020
- Buenos Aires, Argentina, 18-22 May 2020.
http://megc2020.psych.ac.cn:81

Facial micro-expressions (MEs) are involuntary movements of the face that occur spontaneously when a person experiences an emotion but attempts to suppress or repress the facial expression, typically in a high-stakes environment. MEs are very brief, generally lasting no more than 500 milliseconds (ms), and this short duration is the telltale sign that distinguishes them from normal facial expressions. Computational analysis and automation of micro-expression tasks is an emerging area of face research, with strong interest appearing as recently as 2014. Only recently has the availability of a few spontaneously induced facial micro-expression datasets provided the impetus for further advances on the computational front. This is the third edition of the workshop, which aims to promote interaction between researchers and scholars not only within the niche area of facial micro-expression research, but also from the broader areas of expression and psychology research.

This workshop has two main agendas:
1. To organize the Grand Challenge for facial micro-expression research: spotting macro- and micro-expressions in long videos, using CAS(ME)^2 (https://ieeexplore.ieee.org/abstract/document/7820164/) and SAMM Long Videos (https://arxiv.org/abs/1911.01519).
   NEW: The baseline method and results are now available at https://arxiv.org/abs/1912.11985
2. To solicit original works that address a variety of challenges in ME research, including but not limited to:
- ME Spotting / Detection
- ME Recognition
- ME Feature Representations & Computational Analysis
- Unified ME Spot-and-Recognize schemes
- Deep Learning Techniques for ME Analysis
- ME Data Synthesis
- New ME Datasets
- Real-world Applications of ME
- Computational Linkages to Psychology / Neuroscience in MEs

SUBMISSIONS
Submission site is open, and accessible at:
https://cmt3.research.microsoft.com/MEGC2020

Each paper will be reviewed by at least two reviewers from the TPC or external experts in double-blind fashion, so the submitted version of the paper should be appropriately anonymized so as not to reveal the authors or their institutions. Submitted papers should present original work that is not currently under review elsewhere and has no substantial overlap with already published work. All challenge entries should be accompanied by a paper submission.

Submissions (in PDF) can be either a short paper (max. 4 pages + 1 page for references) or a long paper (no more than 8 pages including references) in the IEEE FG 2020 paper format. For further instructions, see https://fg2020.org/instructions-of-paper-submission-for-review/. Workshop papers presented at FG 2020 will appear in the IEEE Xplore digital library. If a paper is accepted, it is assumed that an author will register for and attend the workshop to present the paper. (Papers that are not presented will not be published in the proceedings.)

Dates
06 February 2020: Submission deadline
14 February 2020: Notification of acceptance
28 February 2020: Camera-ready submission

Organizing Chairs
Sujing Wang, Chinese Academy of Sciences, China
Moi Hoon Yap, Manchester Metropolitan University, UK
John See, Multimedia University, Malaysia
Xiaopeng Hong, Xi’an Jiaotong University, China
Xiaobai Li, University of Oulu, Finland

Advisory Panel
Xiaolan Fu, Chinese Academy of Sciences, China
Guoying Zhao, University of Oulu, Finland

===
For further enquiries, please contact Sujing Wang (wangsujing at psych.ac.cn) or Moi Hoon Yap (m.yap at mmu.ac.uk)
===

Many thanks

Best regards
Moi Hoon

Dr. Moi Hoon Yap
Reader in Computer Vision
Lead, Human-Centred Computing
Address:
Manchester Metropolitan University | John Dalton Building (E129) | Chester Street | Manchester | M1 5GD
Telephone: (+44) 0161 247 1503 | Facsimile: (+44) 0161 247 6840
Website: http://www2.docm.mmu.ac.uk/STAFF/M.Yap/

We welcome your participation:
The Third Facial Micro-expressions Grand Challenge: http://megc2020.psych.ac.cn:81/

Our research is reproducible; datasets and software are available at:
http://www2.docm.mmu.ac.uk/STAFF/M.Yap/dataset.php



"Before acting on this email or opening any attachments you should read the Manchester Metropolitan University email disclaimer available on its website http://www.mmu.ac.uk/emaildisclaimer "
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://visionscience.com/pipermail/visionlist_visionscience.com/attachments/20200107/7b24f89c/attachment-0001.html>


More information about the visionlist mailing list