[visionlist] Adaptive Shot Learning for Gesture Understanding and Production (in conjunction with IEEE FG 2017) - CFP

Juan Wachs juan.wachs at gmail.com
Fri Jan 6 15:28:27 EST 2017


Apologies for multiple copies
************************************************************
CALL FOR PAPERS

1st International Workshop on Adaptive Shot Learning for Gesture
Understanding and Production
ASL4GUP 2017
In conjunction with IEEE FG 2017
May 30, 2017, Washington DC, USA
https://engineering.purdue.edu/ASL4GUP

Contact: jpwachs at purdue.edu
________________
IMPORTANT DATES
________________


Submission Deadline: Feb 1, 2017
Notification of Acceptance: March 1, 2017
Camera Ready: March 8, 2017
Workshop: May 30, 2017

________________
SCOPE
________________

To achieve natural interaction with machines, a framework is needed that captures the adaptability humans display when understanding gestures from context, from a single observation, or from multiple observations. This is also referred to as adaptive shot learning: the ability to adapt the recognition mechanism to a gesture that is barely seen, well known, or entirely unknown. Zero-shot and one-shot learning are of particular interest to the community, given that most existing work addresses the N-shot scenario. The workshop also aims to encourage work on the way humans produce gestures, including their kinematic and biomechanical characteristics and the cognitive processes involved in perceiving, remembering, and replicating them. We invite submissions of papers presenting original research on these themes.

________________
TOPICS
________________


Topics of interest include (but are not limited to):
* One-shot and zero-shot gesture recognition;
* Gesture production from context or a single observation;
* EEG-based gesture recognition;
* Context modeling from gesture languages;
* Holistic approaches to gesture modeling and recognition;
* Human-like gesture production and recognition;
* Gesture-based robotic control and interfaces.

________________
SUBMISSIONS
________________


Submissions may be up to 8 pages, in accordance with the IEEE FG conference format. Papers longer than six pages will incur a page fee (100 USD per page) for up to two extra pages. We welcome regular, position, and application papers. Submit through:
https://easychair.org/conferences/?conf=asl4gup2017

Accepted papers will be included in the Proceedings of IEEE FG 2017 & Workshops and submitted for inclusion in the IEEE Xplore digital library. Selected papers will also be invited to submit an extended version to a special issue of a leading journal in the field of machine learning and cognition.

________________
ORGANIZERS
________________

Juan P Wachs (Purdue University, USA); jpwachs at purdue.edu
Richard Voyles (Purdue University, USA); rvoyles at purdue.edu
Susan Fussell (Cornell University, USA); sfussell at cornell.edu
Isabelle Guyon (Université Paris-Saclay, France); guyon at clopinet.com
Sergio Escalera (Computer Vision Center and University of Barcelona,
Spain); sergio.escalera.guerrero at gmail.com

________________
Program Committee
________________

1. Nadia Bianchi-Berthouze, University College London, UK (confirmed)
2. Albert Ali Salah, Bogazici University, Turkey  (confirmed)
3. Adar Pelah, University of York, UK   (confirmed)
4. Hugo Jair Escalante, INAOE, Mexico   (confirmed)
5. Jun Wan, Institute of Automation, Chinese Academy of Sciences, China (confirmed)
6. Miriam Zacksenhouse, Technion, Israel (confirmed)
7. Marta Mejail, Universidad de Buenos Aires (UBA), Argentina (confirmed)
8. Tamar Flash, The Weizmann Institute of Science, Israel (confirmed)
9. Luigi Gallo, Institutes of the National Research Council, Italy (confirmed)
10. Matthew Turk, University of California, Santa Barbara, USA (confirmed)
11. Daniel Gill, University of Winchester, UK (confirmed)
12. Ray Perez, Office of Naval Research (ONR), USA (confirmed)
13. Daniel Foti, Purdue University, USA (confirmed)
14. Yael Edan, Ben-Gurion University of the Negev, Israel (confirmed)